- Title
- Ambient sensor fusion for virtual reality systems
- Creator
- Fountain, Jake Alexander
- Relation
- University of Newcastle Research Higher Degree Thesis
- Resource Type
- thesis
- Date
- 2019
- Description
- Research Doctorate - Doctor of Philosophy (PhD)
- Description
- Modern Virtual Reality (VR) systems rely on precise measurements of the real world to realistically present a virtual environment. Range, capability, and accuracy are key tracking properties that determine the realism and applicability of a VR system. Integrating two or more sensor systems can enhance these properties beyond each individual component system. To support a rich ecosystem of diverse tracking devices for all levels of user competency, algorithms for ambient sensor fusion are required: algorithms that do not require user intervention or knowledge. This thesis investigates each of the three steps required for ambient sensor fusion. First, correlation involves identifying sensor dependencies and temporal relationships. Second, calibration aligns sensor measurement domains. Finally, fusion combines sensor data to extract an underlying statistical model. To address the correlation step, an algorithm for identifying rigid links between position and rotation sensors from different systems was developed and tested with several real-world systems. Additionally, a novel model-less approach to determining latency between two sensor systems has shown promising results for comparing arbitrary dependent signals. To address the second sensor fusion step, an ambient calibration technique was developed that determines the relationship between two systems solely from self-directed user movement. The algorithm was applied to the Microsoft Kinect v2 and popular VR systems to enable body tracking in VR with commodity devices and minimal user setup. To address the final sensor fusion step, a modular multi-modal skeleton fusion algorithm was developed. The algorithm employs a novel constrained articulated Kalman filter to combine skeletal tracking results with high modularity in real time. To test the fusion system, the optical Leap Motion hand tracking system was fused with the inertial Perception Neuron hand tracking system. A user study (n=18) suggested that the proposed system succeeds in generalising the component tracking systems to perform well in a wider variety of scenarios. Finally, the research software has been made available in an open source C++ plugin called ‘Spooky’. Spooky currently supports Unreal Engine 4. Future work will focus on improving the reliability and usability of the Spooky framework, while extending the techniques developed for each of the fusion steps to broader applications.
- Subject
- virtual reality; ambient sensor fusion; calibration; Spooky framework
- Identifier
- http://hdl.handle.net/1959.13/1401166
- Identifier
- uon:34878
- Rights
- Copyright 2019 Jake Alexander Fountain
- Language
- eng
- Full Text
File | Description | Size | Format
---|---|---|---
ATTACHMENT01 | Thesis | 13 MB | Adobe Acrobat PDF
ATTACHMENT02 | Abstract | 612 KB | Adobe Acrobat PDF